Sparse Non-negative Matrix Factorization with Generalized Kullback-Leibler Divergence

Authors

  • Jingwei Chen
  • Yong Feng
  • Yang Liu
  • Bing Tang
  • Wenyuan Wu
Abstract

Non-negative Matrix Factorization (NMF), especially with sparseness constraints, plays a critically important role in data engineering and machine learning. Hoyer (2004) presented an algorithm to compute NMF under exact sparseness constraints, which depend on a projection operator. In the present work, we first give a very simple counterexample on which the projection operator of the Hoyer (2004) algorithm fails. After analysing the cause geometrically, we fix this bug by adding random terms and show that the repaired operator works correctly. Based on the fixed projection operator, we propose another sparse NMF algorithm that optimizes the generalized Kullback-Leibler divergence, hence named SNMF-GKLD. Experimental results show that SNMF-GKLD not only produces results similar to those of Hoyer (2004) on the same data sets, but is also efficient.
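For context, two standard definitions underlying the abstract are Hoyer's sparseness measure (the quantity the projection operator enforces) and the generalized KL divergence that SNMF-GKLD optimizes. A minimal NumPy sketch of both (function names are ours, not the paper's):

```python
import numpy as np

def hoyer_sparseness(x):
    """Hoyer (2004) sparseness of a non-negative vector:
    0 for a constant vector, 1 for a vector with a single
    non-zero entry."""
    n = x.size
    l1 = np.abs(x).sum()
    l2 = np.linalg.norm(x)
    return (np.sqrt(n) - l1 / l2) / (np.sqrt(n) - 1)

def generalized_kl(V, WH, eps=1e-12):
    """Generalized Kullback-Leibler divergence D(V || WH),
    the objective that SNMF-GKLD minimizes."""
    WH = np.maximum(WH, eps)
    return np.sum(V * np.log(np.maximum(V, eps) / WH) - V + WH)
```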


Similar Articles

Non-negative matrix factorization for visual coding

This paper combines linear sparse coding and non-negative matrix factorization into sparse non-negative matrix factorization. In contrast to non-negative matrix factorization, the new model can learn much sparser representations by imposing sparseness constraints explicitly; in contrast to the closely related non-negative sparse coding model, it can learn parts-based representations via fully multi...


Non-negative matrix factorization with fixed row and column sums

In this short note, we focus on the use of the generalized Kullback–Leibler (KL) divergence in the problem of non-negative matrix factorization (NMF). We will show that when using the generalized KL divergence as the cost function for NMF, the row sums and the column sums of the original matrix are preserved in the approximation. We will use this special characteristic in several approximation prob...
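This preservation property can be checked numerically with the standard Lee-Seung multiplicative updates for the generalized KL objective (a sketch under our own setup, not code from the note):

```python
import numpy as np

rng = np.random.default_rng(0)
V = rng.random((6, 5)) + 0.1   # strictly positive data matrix
W = rng.random((6, 2)) + 0.1
H = rng.random((2, 5)) + 0.1

# Standard multiplicative updates for min D(V || WH) under the
# generalized KL divergence (Lee & Seung).
for _ in range(2000):
    H *= (W.T @ (V / (W @ H))) / W.sum(axis=0)[:, None]
    W *= ((V / (W @ H)) @ H.T) / H.sum(axis=1)[None, :]

WH = W @ H
# Both deviations shrink toward 0 as the updates converge:
print(np.abs(V.sum(axis=0) - WH.sum(axis=0)).max())  # column sums
print(np.abs(V.sum(axis=1) - WH.sum(axis=1)).max())  # row sums
```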


Projective Nonnegative Matrix Factorization with α-Divergence

A new matrix factorization algorithm which combines two recently proposed nonnegative learning techniques is presented. Our new algorithm, α-PNMF, inherits the advantages of Projective Nonnegative Matrix Factorization (PNMF) for learning a highly orthogonal factor matrix. When the Kullback-Leibler (KL) divergence is generalized to the α-divergence, it gives our method more flexibility in approximati...
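For reference, the α-divergence family referred to here can be written as follows (a sketch in our own notation, assuming strictly positive arrays; α → 1 recovers the generalized KL divergence):

```python
import numpy as np

def alpha_divergence(V, Vhat, alpha):
    """Amari alpha-divergence between positive arrays.
    alpha -> 1 gives D_KL(V || Vhat); alpha -> 0 gives
    D_KL(Vhat || V)."""
    if np.isclose(alpha, 1.0):
        return np.sum(V * np.log(V / Vhat) - V + Vhat)
    if np.isclose(alpha, 0.0):
        return np.sum(Vhat * np.log(Vhat / V) - Vhat + V)
    return np.sum(V**alpha * Vhat**(1.0 - alpha)
                  - alpha * V + (alpha - 1.0) * Vhat) / (alpha * (alpha - 1.0))
```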


Fast Parallel Randomized Algorithm for Nonnegative Matrix Factorization with KL Divergence for Large Sparse Datasets

Nonnegative Matrix Factorization (NMF) with Kullback-Leibler divergence (NMF-KL) is one of the most significant NMF problems and is equivalent to Probabilistic Latent Semantic Indexing (PLSI), which has been applied successfully in many domains. For sparse count data, a Poisson distribution and KL divergence provide sparse models and sparse representation, which describe the random variation ...
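The Poisson connection is a standard one: minimizing the generalized KL divergence D(V || WH) is equivalent to maximizing the Poisson log-likelihood of V with mean WH, because the two objectives differ only by terms that depend on V alone. A quick numeric check of that claim (our own sketch, assuming positive entries):

```python
import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(1)
V = rng.poisson(5.0, size=(4, 3)).astype(float) + 1.0  # positive counts
A = rng.random((4, 3)) + 0.5                           # candidate mean 1
B = rng.random((4, 3)) + 0.5                           # candidate mean 2

def gkl(V, L):
    """Generalized KL divergence D(V || L)."""
    return np.sum(V * np.log(V / L) - V + L)

def poisson_nll(V, L):
    """Negative Poisson log-likelihood of counts V with mean L."""
    return np.sum(L - V * np.log(L) + gammaln(V + 1.0))

# Identical outputs: the gap depends only on V, not on the model.
print(poisson_nll(V, A) - gkl(V, A))
print(poisson_nll(V, B) - gkl(V, B))
```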


Csiszár's Divergences for Non-negative Matrix Factorization: Family of New Algorithms

In this paper we discuss a wide class of loss (cost) functions for non-negative matrix factorization (NMF) and derive several novel algorithms with improved efficiency and robustness to noise and outliers. We review several approaches which allow us to obtain generalized forms of multiplicative NMF algorithms and unify some existing algorithms. We also give the flexible and relaxed form of the N...
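For orientation, Csiszár's divergences are generated by a convex function f with f(1) = 0; the sketch below shows the general form plus two standard members (our own illustrative choices, not necessarily the ones treated in the paper):

```python
import numpy as np

def csiszar_divergence(V, Vhat, f):
    """Csiszar f-divergence sum_ij Vhat_ij * f(V_ij / Vhat_ij)
    for convex f with f(1) = 0; assumes positive entries."""
    return np.sum(Vhat * f(V / Vhat))

# f(t) = t*log(t) - t + 1 recovers the generalized KL divergence.
gkl_f = lambda t: t * np.log(t) - t + 1.0
# f(t) = (sqrt(t) - 1)**2 gives the squared Hellinger distance.
hellinger_f = lambda t: (np.sqrt(t) - 1.0) ** 2
```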




Journal:

Volume   Issue

Pages  -

Publication date: 2016